Journal article
Adversarial Distillation for Learning with Privileged Provisions
Xiaojie Wang, Rui Zhang, Yu Sun, Jianzhong Qi
IEEE Transactions on Pattern Analysis and Machine Intelligence | Institute of Electrical and Electronics Engineers (IEEE) | Published: 2021
Abstract
Knowledge distillation aims to train a student (model) for accurate inference in a resource-constrained environment. Traditionally, the student is trained by a high-capacity teacher (model) whose training is resource-intensive. A student trained this way is suboptimal because it is difficult to learn the real data distribution from the teacher alone. To address this issue, we propose to train the student against a discriminator in a minimax game. Such a minimax game, however, can take excessively long to converge. To address this issue, we propose adversarial distillation, consisting of a student, a teacher, and a discriminator. The discriminator is now a multi..
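To make the abstract's setup concrete, here is a minimal, hypothetical sketch of the minimax idea: a discriminator learns to tell teacher representations from student representations, while the student updates its parameter to fool the discriminator. Everything below (the scalar "representations", the logistic discriminator, the learning rates) is an illustrative assumption, not the paper's actual architecture, which uses deep networks and a multi-class discriminator.

```python
import numpy as np

# Hypothetical toy setup: teacher representation t = x + 2, student
# representation s = x + w, discriminator D(r) = sigmoid(a*r + b).
# The student should drive w toward the teacher's offset (2.0).

rng = np.random.default_rng(0)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

x = rng.normal(size=512)          # shared inputs
teacher = x + 2.0                 # fixed teacher representations
w = -2.0                          # student parameter, starts far off
a, b = 0.0, 0.0                   # discriminator parameters

lr_d, lr_s = 0.2, 0.2
for _ in range(500):
    student = x + w
    # Discriminator step: minimize -log D(teacher) - log(1 - D(student)).
    dt, ds = sigmoid(a * teacher + b), sigmoid(a * student + b)
    a -= lr_d * (np.mean((dt - 1.0) * teacher) + np.mean(ds * student))
    b -= lr_d * (np.mean(dt - 1.0) + np.mean(ds))
    # Student step: minimize -log D(student), i.e. fool the discriminator.
    ds = sigmoid(a * (x + w) + b)
    w -= lr_s * np.mean((ds - 1.0) * a)

# After alternating updates, w has moved from -2 toward the teacher's 2.
```

The alternating updates mirror the minimax game described in the abstract: when the two representation distributions match, the discriminator's gradients vanish and training settles, which is also why such games can be slow to converge.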
Grants
Awarded by Australian Research Council
Funding Acknowledgements
This work is supported by Australian Research Council Future Fellowship Project FT120100832 and Discovery Project DP180102050.